Channel Coding LP Decoding and Compressed Sensing LP Decoding: Further Connections∗
Authors
Abstract
Channel coding linear programming decoding (CC-LPD) and compressed sensing linear programming decoding (CS-LPD) are two setups that are formally tightly related. Recently, a connection between CC-LPD and CS-LPD was exhibited that goes beyond this formal relationship. The main ingredient was a lemma that allows one to map vectors in the nullspace of a zero-one measurement matrix into vectors of the fundamental cone defined by that matrix. The aim of the present paper is to extend this connection in several directions. In particular, the above-mentioned lemma is extended from real measurement matrices in which every entry equals zero or one to complex measurement matrices in which the absolute value of every entry is a non-negative integer. Moreover, this lemma and its generalizations are used to translate performance guarantees from CC-LPD to CS-LPD. In addition, the present paper extends the formal relationship between CC-LPD and CS-LPD with the help of graph covers. First, this graph-cover viewpoint is used to obtain new connections between, on the one hand, CC-LPD for binary parity-check matrices and, on the other hand, CS-LPD for complex measurement matrices. Second, this graph-cover viewpoint is used to see CS-LPD not only as a well-known relaxation of a zero-norm minimization problem but (at least in the case of real measurement matrices with only zeros, ones, and minus ones) also as a relaxation of a problem we call the zero-infinity operator minimization problem.
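The lemma at the heart of the connection can be checked numerically for the zero-one case: if a vector ν lies in the real nullspace of a zero-one matrix H, then its entrywise absolute value |ν| satisfies the fundamental-cone inequalities of H (each coordinate in a check's support is bounded by the sum of the others). A minimal sketch, using a small hypothetical H chosen only for illustration:

```python
import numpy as np

# A small zero-one measurement / parity-check matrix (illustrative choice).
H = np.array([[1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 1, 1, 1]], dtype=float)

# A vector in the real nullspace of H, obtained from the SVD.
_, _, Vt = np.linalg.svd(H)
nu = Vt[-1]                      # right-singular vector of a zero singular value
assert np.allclose(H @ nu, 0)

# Lemma (zero-one case): w = |nu| lies in the fundamental cone K(H), i.e.
# for every check j and every position i in its support N(j):
#   w_i <= sum of w_{i'} over the other positions i' in N(j).
w = np.abs(nu)
for j in range(H.shape[0]):
    support = np.flatnonzero(H[j])
    for i in support:
        others = [ip for ip in support if ip != i]
        assert w[i] <= sum(w[ip] for ip in others) + 1e-9
print("all fundamental-cone inequalities hold")
```

The inequality follows directly: each row of H ν = 0 says that the entries of ν on the check's support sum to zero, so each entry equals minus the sum of the others, and the triangle inequality gives the bound.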
Similar resources
Using Linear Programming to Decode Linear Codes
Given a linear code and observations from a noisy channel, the decoding problem is to determine the maximum-likelihood (ML) codeword. We describe a method for approximate ML decoding of an arbitrary binary linear code, based on a linear programming (LP) relaxation that is defined by a factor graph or parity-check representation of the code. The resulting LP decoder, which generalizes our previous work...
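The relaxation this entry describes can be sketched for a small code: minimize the log-likelihood-ratio cost over the intersection of each check's local "forbidden set" inequalities (the fundamental polytope). The code below is a sketch assuming `scipy` is available; the (7,4) Hamming matrix and the LLR values are illustrative choices, not taken from the paper:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Parity-check matrix of the (7,4) Hamming code (a standard small example).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def lp_decode(llr):
    """LP decoding: minimize llr^T x over the fundamental polytope, i.e.
    0 <= x <= 1 plus, for every check j and every odd-size subset S of its
    support N(j):  sum_{i in S} x_i - sum_{i in N(j)\\S} x_i <= |S| - 1."""
    A, b = [], []
    for row in H:
        N = np.flatnonzero(row)
        for r in range(1, len(N) + 1, 2):           # odd-size subsets only
            for S in itertools.combinations(N, r):
                a = np.zeros(H.shape[1])
                a[list(N)] = -1.0
                a[list(S)] = 1.0
                A.append(a)
                b.append(len(S) - 1)
    res = linprog(llr, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, 1)] * H.shape[1])
    return res.x

# Example: all-zeros codeword over a BSC with bit 2 flipped.
# Positive LLR favors x_i = 0; negative favors x_i = 1.
llr = np.array([1.0, 1.0, -1.0, 1.0, 1.0, 1.0, 1.0])
x = lp_decode(llr)
print(np.round(x).astype(int))   # recovers the all-zeros codeword
```

With this single flipped bit the LP optimum is integral and equals the transmitted codeword; with more severe noise the optimum can be a fractional vertex (a pseudocodeword), which is exactly the failure mode the fundamental-cone analysis above studies.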
LP Decoding
Linear programming (LP) relaxation is a common technique used to find good solutions to complex optimization problems. We present the method of “LP decoding”: applying LP relaxation to the problem of maximum-likelihood (ML) decoding. An arbitrary binary-input memoryless channel is considered. This treatment of the LP decoding method places our previous work on turbo codes [6] and low-density pa...
Linear Programming Decoding for Non-Uniform Sources and for Binary Channels With Memory
Linear programming (LP) decoding of low-density parity-check codes was introduced by Feldman et al. in [1]. In their formulation it is assumed that communication takes place over a memoryless channel and that the source is uniform. Here, we extend the LP decoding paradigm by studying its application to scenarios with source nonuniformity and to decoding over channels with memory. We develop two d...
Algorithmes itératifs à faible complexité pour le codage de canal et le compressed sensing (Low-Complexity Iterative Algorithms for Channel Coding and Compressed Sensing)
Iterative algorithms are now widely used in all areas of signal processing and digital communications. In modern communication systems, iterative algorithms are used for decoding low-density parity-check (LDPC) codes, a popular class of error-correction codes that are now widely used for their exceptional error-rate performance. In a more recent field known as compressed sensing, iterative algo...
Decoding linear codes via optimization and graph-based techniques
Abstract of the dissertation "Decoding Linear Codes via Optimization and Graph-Based Techniques" by Mohammad H. Taghavi, Doctor of Philosophy in Electrical Engineering (Communications Theory and Systems), University of California San Diego, 2008; Professor Paul H. Siegel, Chair. Low-density parity-check (LDPC) codes have made it possible to communicate at information rates very close to...
Publication date: 2010